Stochastic Neural Networks and Their Solutions to Optimisation Problems
Author
Abstract
Stochastic neural networks, a type of recurrent neural network, can be described simply as "neural networks built by introducing random variations into the network". This randomness arises in one of two ways: by applying stochastic transfer functions to the network's neurons, or by determining the network weights stochastically. This randomness makes stochastic neural networks an ideal tool for optimisation problems, especially those that cannot be solved with classical methods or that require better solutions. In this paper, the properties of stochastic neural networks are investigated and examples of their use in optimisation problems are given.
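As a rough illustration of the first mechanism (a stochastic transfer function), the sketch below shows a Boltzmann-machine-style network whose binary units fire with a probability set by their net input, while the temperature is lowered so the network settles toward low-energy states of an optimisation problem encoded in the weights. The function name, cooling schedule, and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def anneal(W, b, steps=5000, t_start=5.0, t_end=0.05, seed=0):
    """Minimise E(s) = -0.5 * s^T W s - b^T s over binary states s by
    updating one stochastic unit at a time under a falling temperature.
    W is assumed symmetric with a zero diagonal."""
    rng = np.random.default_rng(seed)
    n = len(b)
    s = rng.integers(0, 2, size=n).astype(float)
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / (steps - 1))  # geometric cooling
        i = rng.integers(n)
        net = W[i] @ s + b[i]                 # local field of unit i
        p_on = 1.0 / (1.0 + np.exp(-net / t)) # stochastic transfer function
        s[i] = 1.0 if rng.random() < p_on else 0.0
    return s
```

A concrete problem would be encoded by choosing W and b so that low-energy states correspond to good solutions, e.g. penalty weights between mutually exclusive choices.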
Similar articles
Neuro-Optimizer: A New Artificial Intelligent Optimization Tool and Its Application for Robot Optimal Controller Design
The main objective of this paper is to introduce a new intelligent optimization technique that uses a prediction-correction strategy supported by a recurrent neural network for finding a near-optimal solution of a given objective function. Recently there have been attempts to use artificial neural networks (ANNs) in optimization problems, and some types of ANNs such as the Hopfield network and Boltzm...
Full text
Learning of B-spline Neural Network Using New Particle Swarm Approaches
New approaches to the particle swarm optimisation algorithm, based on Gaussian and Cauchy distributions, are proposed to adjust the control points of B-spline neural networks. B-spline networks are trained by gradient-based methods, which may fall into a local minimum during the learning procedure. To overcome the problems encountered by the conventional learning methods, particle swarm optimisation ...
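For illustration only, a minimal sketch of such a PSO variant is given below: the usual uniform random coefficients in the velocity update are replaced by Gaussian draws (a Cauchy draw would use `rng.standard_cauchy`). The function `pso_gaussian`, its parameters, and the use of NumPy are assumptions and do not reproduce the authors' exact algorithm.

```python
import numpy as np

def pso_gaussian(loss, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise `loss` with a PSO variant whose random coefficients are
    drawn from a Gaussian rather than a uniform distribution."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))   # e.g. B-spline control points
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([loss(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1 = np.abs(rng.normal(size=x.shape))        # Gaussian-distributed factors
        r2 = np.abs(rng.normal(size=x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        f = np.array([loss(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Example: minimise a simple quadratic as a stand-in for a network training loss.
best, value = pso_gaussian(lambda p: float(np.sum(p ** 2)), dim=5)
```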
Full text
Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays
In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
Full text
Activity-Conserving Dynamics for Optimisation Networks
A novel type of dynamics conserving activity in neural networks is presented. We distinguish stochastic and deterministic activity-conserving dynamics. As an example, deterministic (mean-field) activity-conserving dynamics is applied to the N-queens problem. The new dynamics are more successful in finding valid solutions than the standard dynamics.
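As a rough illustration of deterministic (mean-field) activity-conserving dynamics on the N-queens problem, the sketch below conserves each row's total activity with a per-row softmax while annealing soft penalties for column and diagonal conflicts. The function name, cooling schedule, and penalty form are assumptions, not the dynamics defined in the cited paper.

```python
import numpy as np

def nqueens_meanfield(n=8, iters=300, t_start=1.0, t_end=0.05, seed=0):
    """Mean-field annealing for N-queens: one queen per row is enforced by a
    per-row softmax (activity conservation); column and diagonal conflicts
    only enter as soft penalties in the local field."""
    rng = np.random.default_rng(seed)
    v = rng.uniform(0.4, 0.6, (n, n))
    v /= v.sum(axis=1, keepdims=True)                 # normalise row activity
    for k in range(iters):
        t = t_start * (t_end / t_start) ** (k / (iters - 1))
        col = v.sum(axis=0)                           # column occupation
        d1 = np.array([np.trace(v, offset=o) for o in range(-(n - 1), n)])
        d2 = np.array([np.trace(v[:, ::-1], offset=o) for o in range(-(n - 1), n)])
        u = np.empty_like(v)
        for i in range(n):
            for j in range(n):
                # conflicts with *other* squares in the same column/diagonals
                u[i, j] = -((col[j] - v[i, j])
                            + (d1[j - i + n - 1] - v[i, j])
                            + (d2[(n - 1 - j) - i + n - 1] - v[i, j]))
        e = np.exp((u - u.max(axis=1, keepdims=True)) / t)
        v = e / e.sum(axis=1, keepdims=True)          # activity-conserving update
    return v.argmax(axis=1)                           # chosen column in each row
```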
Full text
Semantically enabled process synthesis and optimisation
This paper introduces a new framework to support the synthesis of complex engineering problems, combining stochastic search optimisation with ontological knowledge modelling. The framework uses Tabu search to generate new solutions and introduces a digital-certificate mechanism to translate between the structural information of solutions and the semantics of the ontology. The solutions are respect...
Full text